
    Multivariate modality inference using Gaussian kernel

    The number of modes (also known as the modality) of a kernel density estimator (KDE) has drawn considerable interest and is important in practice. In this paper, we develop an inference framework for the modality of a KDE in the multivariate setting using the Gaussian kernel. We apply the modal clustering method proposed by [1] for mode hunting. A test statistic and its asymptotic distribution are derived to assess the significance of each mode. The inference procedure is applied to both simulated and real data sets.
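
    The mode-hunting step can be illustrated with a minimal sketch: mean-shift-style modal clustering under a Gaussian kernel, where each sample point is hill-climbed to a local maximum of the KDE and coinciding limits are merged into distinct modes. The bandwidth, tolerances, and toy data below are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

def mean_shift_modes(data, bandwidth=0.5, tol=1e-6, max_iter=500, merge_tol=0.1):
    """Climb each point to a local maximum of the Gaussian KDE; merge duplicates."""
    modes = []
    for x in data.astype(float):
        for _ in range(max_iter):
            # Gaussian-kernel weights of all samples relative to the current point
            w = np.exp(-np.sum((data - x) ** 2, axis=1) / (2 * bandwidth ** 2))
            x_new = (w[:, None] * data).sum(axis=0) / w.sum()
            if np.linalg.norm(x_new - x) < tol:
                break
            x = x_new
        # record the limit as a new mode unless it coincides with a known one
        if not any(np.linalg.norm(x - m) < merge_tol for m in modes):
            modes.append(x)
    return np.array(modes)

rng = np.random.default_rng(0)
# two tight, well-separated 2-D clusters -> the KDE should have two modes
data = np.vstack([rng.normal(0.0, 0.3, (100, 2)),
                  rng.normal(5.0, 0.3, (100, 2))])
modes = mean_shift_modes(data)
print(len(modes))  # -> 2
```

The paper's contribution is the significance test attached to each discovered mode; the sketch covers only the mode-hunting stage.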

    The topography of multivariate normal mixtures

    Multivariate normal mixtures provide a flexible method of fitting high-dimensional data. It is shown that their topography, in the sense of their key features as a density, can be analyzed rigorously in lower dimensions by use of a ridgeline manifold that contains all critical points, as well as the ridges of the density. A plot of the elevations on the ridgeline shows the key features of the mixed density. In addition, by use of the ridgeline, we uncover a function that determines the number of modes of the mixed density when two components are being mixed. A follow-up analysis then gives a curvature function that can be used to prove a set of modality theorems. Published at http://dx.doi.org/10.1214/009053605000000417 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
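
    For the two-component case the ridgeline has a closed form, x(a) = [(1-a)S1^-1 + a S2^-1]^-1 [(1-a)S1^-1 mu1 + a S2^-1 mu2] for a in [0, 1], and counting local maxima of the mixture density along this one-dimensional curve counts the modes of the full density. A minimal sketch (the specific means, covariances, and grid size are illustrative):

```python
import numpy as np
from scipy.stats import multivariate_normal

def ridgeline_mode_count(mu1, mu2, S1, S2, w=0.5, grid=2001):
    """Count modes of a two-component normal mixture via its ridgeline."""
    P1, P2 = np.linalg.inv(S1), np.linalg.inv(S2)
    alphas = np.linspace(0.0, 1.0, grid)
    # ridgeline: x(a) = [(1-a)P1 + a P2]^-1 [(1-a)P1 mu1 + a P2 mu2]
    xs = np.array([np.linalg.solve((1 - a) * P1 + a * P2,
                                   (1 - a) * P1 @ mu1 + a * P2 @ mu2)
                   for a in alphas])
    # elevation of the mixture density along the ridgeline
    g = (w * multivariate_normal(mu1, S1).pdf(xs)
         + (1 - w) * multivariate_normal(mu2, S2).pdf(xs))
    padded = np.r_[-np.inf, g, -np.inf]  # so endpoint maxima would count too
    return int(np.sum((padded[1:-1] > padded[:-2]) & (padded[1:-1] > padded[2:])))

I2 = np.eye(2)
n_far = ridgeline_mode_count(np.zeros(2), np.array([4.0, 0.0]), I2, I2)
n_near = ridgeline_mode_count(np.zeros(2), np.array([1.0, 0.0]), I2, I2)
print(n_far, n_near)  # -> 2 1
```

For equal weights and equal spherical unit covariances, the mixture is bimodal exactly when the means are more than two standard deviations apart, which the two calls above illustrate.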

    Analysis of PET Imaging for Tumor Delineation

    The primary goal of this research is to build a statistical framework for automated PET image analysis that is closer to human perception. Although manual interpretation of a PET image is more accurate and reproducible than thresholding-based semiautomatic segmentation methods, human contouring has large interobserver and intraobserver variation and, moreover, is extremely time-consuming. Further, it is harder for humans to analyze more than two dimensions at a time, and harder still if multiple modalities are involved. If the task is to analyze a series of images, it quickly becomes an onerous job for a single person. The new statistical framework is designed to mimic human perception for tumor delineation and marry it with all the advantages of an analytic method in a modern computing environment.

    Functional principal component analysis of spatially correlated data

    Get PDF
    This paper focuses on the analysis of spatially correlated functional data. We propose a parametric model for spatial correlation, in which the between-curve correlation is modeled by correlating the functional principal component scores of the functional data. Additionally, in the sparse observation framework, we propose a novel approach of spatial principal analysis by conditional expectation to explicitly estimate spatial correlations and reconstruct individual curves. Assuming spatial stationarity, empirical spatial correlations are calculated as the ratio of eigenvalues of the smoothed covariance surface Cov(Xi(s), Xi(t)) and the cross-covariance surface Cov(Xi(s), Xj(t)) at locations indexed by i and j. An anisotropic Matérn spatial correlation model is then fitted to the empirical correlations. Finally, principal component scores are estimated to reconstruct the sparsely observed curves. This framework can naturally accommodate arbitrary covariance structures, but there is an enormous reduction in computation if one can assume separability of the temporal and spatial components. We demonstrate the consistency of our estimates and propose hypothesis tests to examine the separability as well as the isotropy of the spatial correlation. Using simulation studies, we show that these methods have clear advantages over existing methods of curve reconstruction and model parameter estimation.
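
    The step of fitting a Matérn correlation model to empirical spatial correlations can be sketched as follows. As a simplification of the anisotropic model in the abstract, the smoothness is fixed at nu = 3/2 (which has a closed form) and only an isotropic range parameter is fitted; the distances, noise level, and true range below are made up for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def matern32(d, rho):
    """Matern correlation with smoothness nu = 3/2 and range parameter rho."""
    u = np.sqrt(3.0) * np.asarray(d) / rho
    return (1.0 + u) * np.exp(-u)

# synthetic "empirical" correlations: true range 2.0 plus small noise
dists = np.linspace(0.1, 8.0, 40)
np.random.seed(1)
emp_corr = matern32(dists, 2.0) + np.random.normal(0.0, 0.01, dists.size)

# nonlinear least squares for the range parameter
(rho_hat,), _ = curve_fit(matern32, dists, emp_corr, p0=[1.0])
print(round(rho_hat, 2))  # close to the true range 2.0
```

A full implementation would replace the synthetic correlations with the eigenvalue ratios described in the abstract and add anisotropy via a direction-dependent range.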

    Functional factor analysis for periodic remote sensing data

    We present a new approach to factor rotation for functional data. This is achieved by rotating the functional principal components toward a predefined space of periodic functions designed to decompose the total variation into components that are nearly periodic and nearly aperiodic with a predefined period. We show that the factor rotation can be obtained by calculating canonical correlations between appropriate spaces, which makes the methodology computationally efficient. Moreover, we demonstrate that our proposed rotations provide stable and interpretable results in the presence of a highly complex covariance structure. This work is motivated by the goal of finding interpretable sources of variability in gridded time series of vegetation index measurements obtained from remote sensing, and we demonstrate our methodology through an application of factor rotation to these data. Published at http://dx.doi.org/10.1214/11-AOAS518 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
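
    The computational core, canonical correlations between two function spaces, reduces on a common evaluation grid to principal angles between column spaces, obtainable from an SVD of the product of orthonormalized bases. A minimal sketch with made-up bases (the grid, period, and component choices are illustrative, not the paper's data):

```python
import numpy as np

def canonical_correlations(A, B):
    """Canonical correlations between column spaces of A and B (cosines of principal angles)."""
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.clip(s, 0.0, 1.0)

t = np.linspace(0.0, 1.0, 200)
# stand-in "functional PC" space: one periodic and one aperiodic direction
pcs = np.column_stack([np.sin(2 * np.pi * t), t])
# predefined periodic space with period 1: first Fourier pair
periodic = np.column_stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
rho = canonical_correlations(pcs, periodic)
print(rho)  # first correlation near 1 (the shared sine), second much smaller
```

The size of each canonical correlation indicates how much of a principal component's variation can be rotated into the periodic space, which is what makes the resulting factors interpretable as nearly periodic versus nearly aperiodic.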

    Quadratic distances on probabilities: A unified foundation

    This work builds a unified framework for the study of quadratic form distance measures as they are used in assessing the goodness of fit of models. Many important procedures have this structure, but the theory for these methods is dispersed and incomplete. Central to the statistical analysis of these distances is the spectral decomposition of the kernel that generates the distance. We show how this determines the limiting distribution of natural goodness-of-fit tests. Additionally, we develop a new notion, the spectral degrees of freedom of the test, based on this decomposition. The degrees of freedom are easy to compute and estimate, and can be used as a guide in the construction of useful procedures in this class. Published at http://dx.doi.org/10.1214/009053607000000956 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
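
    As an illustration of the role of the spectral decomposition, the sketch below computes the eigenvalues of a centered Gaussian-kernel Gram matrix, simulates the corresponding weighted chi-squared limit, and forms a Satterthwaite-style effective degrees of freedom (squared eigenvalue sum over sum of squared eigenvalues). The kernel, sample, and this particular degrees-of-freedom summary are assumptions for illustration, not necessarily the paper's exact definitions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=(n, 1))

# centered Gaussian-kernel Gram matrix on the sample
K = np.exp(-((x - x.T) ** 2) / 2.0)
H = np.eye(n) - np.ones((n, n)) / n
lam = np.linalg.eigvalsh(H @ K @ H) / n   # empirical kernel eigenvalues
lam = lam[lam > 1e-10]

# Satterthwaite-style effective degrees of freedom of the quadratic form
dof = lam.sum() ** 2 / (lam ** 2).sum()

# simulate the weighted chi-squared limiting distribution: sum_j lam_j * Z_j^2
z = rng.normal(size=(5000, lam.size))
null_draws = (z ** 2) @ lam
print(round(dof, 1), round(null_draws.mean(), 2))
```

The simulated null distribution is what a goodness-of-fit p-value would be read from; the degrees-of-freedom summary condenses the whole spectrum into a single calibration number.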

    A New Framework for Distance-based Functional Clustering

    We develop a new framework for clustering functional data based on a distance matrix, analogous to spectral clustering of multivariate data. First, we smooth the raw observations to the desired smoothness through a penalized fit. Next, we create a distance matrix from either the smoothed curves or their available derivatives; the choice of distance depends on the nature of the data. Finally, we apply the spectral clustering algorithm. We applied our newly developed approach, Functional Spectral Clustering (FSC), to simulated and real data sets, where it showed better accuracy than existing methods.
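
    The three steps above (smooth, build a distance matrix, spectrally cluster) can be sketched with off-the-shelf tools. The toy curves, the Gaussian smoother standing in for a penalized fit, and the affinity bandwidth below are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)
# toy functional data: 20 noisy sine curves and 20 noisy linear trends
curves = np.vstack([np.sin(2 * np.pi * t) + rng.normal(0, 0.3, (20, 100)),
                    2 * t - 1 + rng.normal(0, 0.3, (20, 100))])

# Step 1: smooth the raw observations
smooth = gaussian_filter1d(curves, sigma=3, axis=1)

# Step 2: pairwise L2 distance matrix between the smoothed curves
D = np.linalg.norm(smooth[:, None, :] - smooth[None, :, :], axis=2)

# Step 3: spectral clustering on an affinity built from the distances
A = np.exp(-(D ** 2) / (2 * np.median(D) ** 2))
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(A)
print(labels)  # the two families of curves fall into separate clusters
```

Swapping the L2 distance on the curves for a distance on their derivatives only changes step 2, which is the flexibility the framework is built around.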

    Reconstruction of f(R,T) gravity model via the Raychaudhuri equation

    In this work, we investigate an analytical solution for an alternative modified gravity theory, viz., f(R,T) gravity, for two different eras, i.e., the matter-dominated and the dark-energy-dominated accelerating universe, from a purely geometrical and mathematical point of view with the help of the well-known Raychaudhuri equation. To construct the f(R,T) gravity model, we take the functional form of f(R,T) to be the sum of two independent functions of the Ricci scalar R and the trace of the energy-momentum tensor T, respectively, under the assumption of a power-law expansion of the universe. In this study, the viability, stability, and all the energy conditions are examined. The strong energy condition is not satisfied for our model, which is expected in the present scenario of the universe. (14 pages, 27 figures.)
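
    The geometric starting point can be made concrete: for a flat FLRW universe with power-law expansion a(t) proportional to t^n (the abstract's assumption), the expansion scalar is theta = 3 (da/dt)/a = 3n/t, and the shear-free, rotation-free combination d(theta)/dt + theta^2/3 appearing on the left of the Raychaudhuri equation evaluates to 3n(n-1)/t^2, negative (focusing) for 0 < n < 1. A quick symbolic check of this standard FLRW kinematics:

```python
import sympy as sp

t, n = sp.symbols("t n", positive=True)
a = t ** n                      # power-law scale factor a(t) ~ t^n
theta = 3 * sp.diff(a, t) / a   # expansion scalar theta = 3 a'/a = 3n/t
# Raychaudhuri left-hand side for a shear-free, rotation-free congruence
lhs = sp.diff(theta, t) + theta ** 2 / 3
print(sp.simplify(lhs))  # equals 3*n*(n - 1)/t**2
```

Everything beyond this kinematic identity (the choice of f(R,T) and the energy-condition analysis) is the paper's model-specific content.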